Efficient Variational Inference for Gaussian Process Regression Networks

Authors

  • Trung V. Nguyen
  • Edwin V. Bonilla
Abstract

In multi-output regression applications the correlations between the response variables may vary with the input space and can be highly non-linear. Gaussian process regression networks (GPRNs) are flexible and effective models for representing such complex adaptive output dependencies. However, inference in GPRNs is intractable. In this paper we propose two efficient variational inference methods for GPRNs. The first method, GPRN-MF, adopts a mean-field approach with full Gaussians over the GPRN’s parameters as its factorizing distributions. The second method, GPRN-NPV, uses a nonparametric variational inference approach. We derive analytical forms of the evidence lower bound for both methods, which we use to learn the variational parameters and the hyperparameters of the GPRN model. We obtain closed-form updates for the parameters of GPRN-MF and show that, despite the relatively complex approximate posterior distributions, our methods require the estimation of only O(N) variational parameters rather than the O(N²) that would be needed for the parameters’ full covariances. Our experiments on real data sets show that GPRN-NPV may give a better approximation to the posterior distribution than GPRN-MF, in terms of both predictive performance and stability.
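For context, here is a brief sketch of the GPRN model and the two variational families named above; this is our own summary in standard notation (P outputs, Q latent functions, mixture size K), not text from the paper. In the GPRN of Wilson et al., the outputs are generated from latent functions through an input-dependent mixing matrix,

    y(x) = W(x) [ f(x) + \sigma_f \epsilon ] + \sigma_y z,    \epsilon, z \sim N(0, I),

where every latent function f_j(x) and every weight W_{ij}(x) has a Gaussian process prior. GPRN-MF posits a factorized posterior with one full Gaussian per latent function and per weight,

    q_{MF}(f, W) = \prod_j N(f_j \mid \mu_j, \Sigma_j) \, \prod_{i,j} N(W_{ij} \mid \nu_{ij}, \Omega_{ij}),

whereas GPRN-NPV, following nonparametric variational inference, uses a mixture of isotropic Gaussians over the stacked latent values \theta = (f, W),

    q_{NPV}(\theta) = \frac{1}{K} \sum_{k=1}^{K} N(\theta \mid \mu_k, \sigma_k^2 I).

Both choices yield an evidence lower bound in analytical form, which is then maximized over the variational parameters and the GPRN hyperparameters.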


Related papers

Gaussian Process Regression Networks

We introduce a new regression framework, Gaussian process regression networks (GPRN), which combines the structural properties of Bayesian neural networks with the nonparametric flexibility of Gaussian processes. This model accommodates input dependent signal and noise correlations between multiple response variables, input dependent length-scales and amplitudes, and heavy-tailed predictive dis...

Fast Bayesian Inference for Non-Conjugate Gaussian Process Regression

We present a new variational inference algorithm for Gaussian process regression with non-conjugate likelihood functions, with application to a wide array of problems including binary and multi-class classification, and ordinal regression. Our method constructs a concave lower bound that is optimized using an efficient fixed-point updating algorithm. We show that the new algorithm has highly co...

Asynchronous Distributed Variational Gaussian Process for Regression

Gaussian processes (GPs) are powerful nonparametric function estimators. However, their applications are largely limited by the expensive computational cost of the inference procedures. Existing stochastic or distributed synchronous variational inferences, although have alleviated this issue by scaling up GPs to millions of samples, are still far from satisfactory for real-world large applicati...

Variational Gaussian process classifiers

Gaussian processes are a promising nonlinear regression tool, but it is not straightforward to solve classification problems with them. In this paper the variational methods of Jaakkola and Jordan are applied to Gaussian processes to produce an efficient Bayesian binary classifier.
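For reference, the Jaakkola–Jordan construction referred to here lower-bounds the logistic likelihood by a quadratic (Gaussian-form) function of the latent value; this is the standard statement of the bound, added for illustration and not quoted from the paper:

    \sigma(z) \ge \sigma(\xi) \exp\big( (z - \xi)/2 - \lambda(\xi)(z^2 - \xi^2) \big),    \lambda(\xi) = \frac{\tanh(\xi/2)}{4\xi},

valid for every variational parameter \xi and tight at z = \pm\xi. Because the bound is quadratic in z, it keeps the Gaussian process posterior update in closed (Gaussian) form.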

Doubly Stochastic Variational Bayes for non-Conjugate Inference

We propose a simple and effective variational inference algorithm based on stochastic optimisation that can be widely applied for Bayesian non-conjugate inference in continuous parameter spaces. This algorithm is based on stochastic approximation and allows for efficient use of gradient information from the model joint density. We demonstrate these properties using illustrative examples as well...
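To illustrate the general recipe described above, here is a minimal sketch under our own toy assumptions (not the authors' algorithm or code): parameterize q as a Gaussian, reparameterize samples as theta = mu + sigma * z, and follow stochastic gradients of a Monte Carlo ELBO estimate built from the gradient of the model's log joint density.

# Toy sketch of doubly stochastic variational inference (illustrative only):
# fit a diagonal Gaussian q(theta) = N(mu, diag(sigma^2)) to an unnormalized
# target by following reparameterization-gradient estimates of the ELBO.
import numpy as np

rng = np.random.default_rng(0)
D = 2

def grad_log_joint(theta):
    """Gradient of an assumed log joint density; here a correlated Gaussian
    target chosen purely for illustration."""
    prec = np.array([[2.0, 0.9], [0.9, 1.0]])   # precision matrix of the target
    return -prec @ theta

mu, rho = np.zeros(D), np.zeros(D)              # sigma = exp(rho)
lr, n_samples = 0.05, 8

for step in range(2000):
    sigma = np.exp(rho)
    g_mu, g_rho = np.zeros(D), np.zeros(D)
    for _ in range(n_samples):
        z = rng.standard_normal(D)
        theta = mu + sigma * z                  # reparameterization trick
        g = grad_log_joint(theta)               # gradient information from the joint
        g_mu += g
        g_rho += g * z * sigma                  # chain rule through sigma = exp(rho)
    # ELBO gradient = Monte Carlo expectation term + entropy gradient
    # (the entropy of q contributes +1 per coordinate of rho)
    mu += lr * g_mu / n_samples
    rho += lr * (g_rho / n_samples + 1.0)

print("fitted q mean:", mu)
print("fitted q stds:", np.exp(rho))

The same estimator extends to minibatches of data, so that both the variational noise z and the data are subsampled, which is where the "doubly stochastic" name comes from.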


Publication date: 2013